A relaxed customized proximal point algorithm for separable convex programming
Authors
Abstract
The alternating direction method (ADM) is a classical approach for solving a linearly constrained separable convex programming problem (the primal problem), and it is well known that ADM is essentially the application of a concrete form of the proximal point algorithm (PPA) (more precisely, the Douglas-Rachford splitting method) to the corresponding dual problem. This paper shows that an efficient method competitive with ADM can be derived easily by applying PPA directly to the primal problem. More specifically, if the proximal parameters are chosen judiciously according to the separable structure of the primal problem, the resulting customized PPA admits a decomposition algorithmic framework similar to that of ADM. The customized PPA and ADM are equally effective at exploiting the separable structure of the primal problem, equally efficient numerically, and equally easy to implement. Moreover, the customized PPA is readily accelerated by an over-relaxation step, yielding a relaxed customized PPA for the primal problem. We verify numerically that the customized PPA is competitive with ADM in efficiency and that the over-relaxation step is effective. Furthermore, we provide a simple proof of the O(1/t) convergence rate of the relaxed customized PPA.
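For intuition about the kind of decomposition framework referred to above, the short sketch below runs the classical ADM with an over-relaxation step on a tiny separable toy problem in which both subproblems have closed forms. It is meant only as an illustration of per-block closed-form updates followed by a relaxation step, not as an implementation of the paper's customized PPA; the toy problem, the penalty parameter beta, and the relaxation factor gamma are arbitrary choices made for this example.

import numpy as np

def over_relaxed_adm(c, tau=1.0, beta=1.0, gamma=1.5, iters=300):
    # Solves: min 0.5*||x - c||^2 + 0.5*tau*||y||^2  s.t.  x - y = 0
    # (i.e. A = I, B = -I, b = 0), so both subproblems are closed-form.
    x = np.zeros_like(c)
    y = np.zeros_like(c)
    lam = np.zeros_like(c)                     # multiplier of x - y = 0
    for _ in range(iters):
        # x-subproblem of the augmented Lagrangian (closed form)
        x = (c + lam + beta * y) / (1.0 + beta)
        # over-relaxation: blend the new x-block with the previous y-block
        x_hat = gamma * x + (1.0 - gamma) * y
        # y-subproblem (closed form)
        y = (beta * x_hat - lam) / (tau + beta)
        # multiplier update
        lam = lam - beta * (x_hat - y)
    return x, y

c = np.array([2.0, -1.0, 4.0])
x, y = over_relaxed_adm(c, tau=1.0)
print(x)   # both x and y approach the exact solution c / (1 + tau)

With gamma = 1 the iteration reduces to the plain ADM; values of gamma around 1.5 to 1.8 are common over-relaxation choices in practice.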
Similar articles
A parameterized proximal point algorithm for separable convex optimization
In this paper, we develop a parameterized proximal point algorithm (PPPA) for solving a class of separable convex programming problems subject to linear and convex constraints. The proposed algorithm is provably globally convergent with a worst-case O(1/t) convergence rate, where t denotes the iteration number. By properly choosing the algorithm parameters, numerical experiments on solvin...
On Relaxation of Some Customized Proximal Point Algorithms for Convex Minimization: From Variational Inequality Perspective
The proximal point algorithm (PPA) is a fundamental method for convex programming. When PPA is applied to solve linearly constrained convex problems, we may prefer to choose an appropriate metric matrix to define the proximal regularization, so that the computational burden of the resulting PPA can be reduced and, in most cases, its subproblems even admit closed-form or efficient solutions. This idea results in th...
Customized proximal point algorithms for linearly constrained convex minimization and saddle-point problems: a unified approach
This paper takes a uniform look at the customized applications of the proximal point algorithm (PPA) to two classes of problems: the linearly constrained convex minimization problem with a generic or separable objective function, and a saddle-point problem. We model these two classes of problems uniformly by a mixed variational inequality, and show how PPA with customized proximal parameters can yie...
Relatively Relaxed Proximal Point Algorithms for Generalized Maximal Monotone Mappings and Douglas-Rachford Splitting Methods
The theory of maximal set-valued monotone mappings provides a powerful framework for the study of convex programming and variational inequalities. Based on the notion of relatively maximal relaxed monotonicity, the approximation solvability of a general class of inclusion problems is discussed, while generalizing most investigations of weak convergence using the proximal point algorithm in a r...
Super-Relaxed (η)-Proximal Point Algorithms, Relaxed (η)-Proximal Point Algorithms, Linear Convergence Analysis, and Nonlinear Variational Inclusions
We glance at recent advances in the general theory of maximal set-valued monotone mappings and their demonstrated role in examining convex programming and the closely related field of nonlinear variational inequalities. We focus mostly on applications of the super-relaxed (η)-proximal point algorithm to the context of solving a class of nonlinear variational inclusion problems, based on the notion ...